

Search for: All records

Creators/Authors contains: "Vitak, Jessica"

Note: When clicking on a Digital Object Identifier (DOI) number, you will be taken to an external site maintained by the publisher. Some full-text articles may not yet be available free of charge during the publisher's embargo period.

Some links on this page may take you to non-federal websites. Their policies may differ from those of this site.

  1. Baym, Nancy; Ellison, Nicole (Eds.)
    The future of work increasingly focuses on the collection and analysis of worker data to monitor communication, ensure productivity, reduce security threats, and assist in decision-making. The COVID-19 pandemic increased employer reliance on these technologies; however, the blurring of home and work boundaries meant these monitoring tools might also surveil private spaces. To explore workers’ attitudes toward increased monitoring practices, we present findings from a factorial vignette survey of 645 U.S. adults who worked from home during the early months of the pandemic. Using the theory of privacy as contextual integrity to guide the survey design and analysis, we unpack the types of workplace surveillance practices that violate privacy norms and consider attitudinal differences between male and female workers. Our findings highlight that the acceptability of workplace surveillance practices is highly contextual, and that reductions in privacy and autonomy at work may further exacerbate power imbalances, especially for vulnerable employees.
    Free, publicly accessible full text available June 12, 2024
  2. There is a rich literature on technology’s role in facilitating employee monitoring in the workplace. The COVID-19 pandemic created many challenges for employers, and many companies turned to new forms of monitoring to ensure remote workers remained productive; however, these technologies raise important privacy concerns as the boundaries between work and home are further blurred. In this paper, we present findings from a study of 645 US workers who spent at least part of 2020 working remotely due to the pandemic. We explore how their work experiences (job satisfaction, stress, and security) changed between January and November 2020, as well as their attitudes toward and concerns about being monitored. Findings support anecdotal evidence that the pandemic has had an uneven effect on workers, with women reporting more negative effects on their work experiences. In addition, while nearly 40% of workers reported their employer began using new surveillance tools during the pandemic, a significant percentage were unsure, suggesting there is confusion or a lack of transparency regarding how new policies are communicated to staff. We consider these findings in light of prior research and discuss the benefits and drawbacks of various approaches to minimize surveillance-related worker harms. 
  3. Social media provides unique opportunities for researchers to learn about a variety of phenomena—it is often publicly available, highly accessible, and affords more naturalistic observation. However, as research using social media data has increased, so too has public scrutiny, highlighting the need to develop ethical approaches to social media data use. Prior work in this area has explored users’ perceptions of researchers’ use of social media data in the context of a single platform. In this paper, we expand on that work, exploring how platforms and their affordances impact how users feel about social media data reuse. We present results from three factorial vignette surveys, each focusing on a different platform—dating apps, Instagram, and Reddit—to assess users’ comfort with research data use scenarios across a variety of contexts. Although our results highlight different expectations between platforms depending on the research domain, purpose of research, and content collected, we find that the factor with the greatest impact across all platforms is consent—a finding which presents challenges for big data researchers. We conclude by offering a sociotechnical approach to ethical decision-making. This approach provides recommendations on how researchers can interpret and respond to platform norms and affordances to predict potential data use sensitivities. The approach also recommends that researchers respond to the predominant expectation of notification and consent for research participation by bolstering awareness of data collection on digital platforms. 
  4. In recent years, the CHI community has seen significant growth in research on Human-Centered Responsible Artificial Intelligence. While different research communities may use different terminology to discuss similar topics, all of this work is ultimately aimed at developing AI that benefits humanity while being grounded in human rights and ethics, and reducing the potential harms of AI. In this special interest group, we aim to bring together researchers from academia and industry interested in these topics to map current and future research trends to advance this important area of research by fostering collaboration and sharing ideas.
  5. Research using online datasets from social media platforms continues to grow in prominence, but recent research suggests that platform users are sometimes uncomfortable with the ways their posts and content are used in research studies. While previous research has suggested that a variety of contextual variables may influence this discomfort, such factors have yet to be isolated and compared. In this article, we present results from a factorial vignette survey of American Facebook users. Findings reveal that researcher domain, content type, purpose of data use, and awareness of data collection all impact respondents’ comfort—measured via judgments of acceptability and concern—with diverse data uses. We provide guidance to researchers and ethics review boards about the ways that user reactions to research uses of their data can serve as a cue for identifying sensitive data types and uses.
  6. Frequent public uproar over forms of data science that rely on information about people demonstrates the challenges of defining and demonstrating trustworthy digital data research practices. This paper reviews problems of trustworthiness in what we term pervasive data research: scholarship that relies on the rich information generated about people through digital interaction. We highlight the entwined problems of participant unawareness of such research and the relationship of pervasive data research to corporate datafication and surveillance. We suggest a way forward by drawing from the history of a different methodological approach in which researchers have struggled with trustworthy practice: ethnography. To grapple with the colonial legacy of their methods, ethnographers have developed analytic lenses and researcher practices that foreground relations of awareness and power. These lenses are inspiring but also challenging for pervasive data research, given the flattening of contexts inherent in digital data collection. We propose ways that pervasive data researchers can incorporate reflection on awareness and power within their research to support the development of trustworthy data science. 
  7. The COVID-19 global pandemic led governments, health agencies, and technology companies to work on solutions to minimize the spread of the disease. One such solution concerns contact-tracing apps whose utility is tied to widespread adoption. Using survey data collected a few weeks into lockdown measures in the United States, we explore Americans’ willingness to install a COVID-19 tracking app. Specifically, we evaluate how the distributor of such an app (e.g., government, health-protection agency, technology company) affects people’s willingness to adopt the tool. While we find that 67 percent of respondents are willing to install an app from at least one of the eight providers included, the factors that predict one’s willingness to adopt differ. Using Nissenbaum’s theory of privacy as contextual integrity, we explore differences in responses across distributors and discuss why some distributors may be viewed as less appropriate than others in the context of providing health-related apps during a global pandemic. We conclude the paper by providing policy recommendations for wide-scale data collection that minimizes the likelihood that such tools violate the norms of appropriate information flows.
  8. The global coronavirus pandemic has raised important questions regarding how to balance public health concerns with privacy protections for individual citizens. In this essay, we evaluate contact tracing apps, which have been offered as a technological solution to minimize the spread of COVID-19. We argue that apps such as those built on Google and Apple’s “exposure notification system” should be evaluated in terms of the contextual integrity of information flows; in other words, the appropriateness of sharing health and location data will be contextually dependent on factors such as who will have access to data, as well as the transmission principles underlying data transfer. We also consider the role of prevailing social and political values in this assessment, including the large-scale social benefits that can be obtained through such information sharing. However, caution should be taken before violating contextual integrity, even in the case of a pandemic, because doing so risks a long-term loss of autonomy and growing function creep for surveillance and monitoring technologies.
  9. Researchers and policymakers advocate teaching children about digital privacy, but privacy literacy has not been theorized for children. Drawing on interviews with 30 families, including 40 children, we analyze children’s perspectives on password management in three contexts—family life, friendship, and education—and develop a new approach to privacy literacy grounded in Nissenbaum’s contextual integrity framework. Contextual integrity equates privacy with appropriate flows of information, and we show how children’s perceptions of the appropriateness of disclosing a password varied across contexts. We explain why privacy literacy should focus on norms rather than rules and discuss how adults can use learning moments to strengthen children’s privacy literacy. We argue that equipping children to make privacy-related decisions serves them better than instructing them to follow privacy-related rules.